8 research outputs found

    A unified theory for weak separation properties

    We devise a framework that leads to the formulation of a unified theory of normality (regularity), semi-normality (semi-regularity), s-normality (s-regularity), feeble normality (feeble regularity), pre-normality (pre-regularity), and others. Certain aspects of the theory are given unified proofs

    Large margin classifiers based on affine hulls

    Special Issue: 10th Brazilian Symposium on Neural Networks (SBRN2008). This paper introduces a geometrically inspired large margin classifier that can be a better alternative to support vector machines (SVMs) for classification problems with a limited number of training samples. In contrast to the SVM classifier, we approximate classes with the affine hulls of their samples rather than with convex hulls. For any pair of classes approximated by affine hulls, we introduce two ways to find the best separating hyperplane between them. In the first proposed formulation, we compute the closest points on the affine hulls of the two classes and connect them with a line segment; the optimal separating hyperplane is chosen to be the hyperplane orthogonal to this segment that bisects it. The second formulation is derived by modifying the nu-SVM formulation. Both formulations are extended to the nonlinear case via the kernel trick. Based on our findings, we also develop a geometric interpretation of the least squares SVM classifier and show that it is a special case of the proposed method. Multi-class classification problems are handled by constructing and combining several binary classifiers, as in SVM. Experiments on several databases show that the proposed methods work as well as the SVM classifier, if not better
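    The first formulation described above can be sketched in a few lines of numpy: the closest pair of points between two affine hulls is a linear least-squares problem, and the separating hyperplane is the perpendicular bisector of the segment joining them. This is a minimal illustration under that reading of the abstract, not the authors' implementation; all function names and the toy data are our own.

    ```python
    import numpy as np

    def closest_points(X1, X2):
        """Closest points between the affine hulls of two sample sets (rows = samples).

        Each hull is parameterized as mean + (centered samples).T @ coeffs;
        minimizing the distance between hulls is then a least-squares problem.
        """
        mu1, mu2 = X1.mean(axis=0), X2.mean(axis=0)
        V1 = (X1 - mu1).T                      # directions spanning hull 1
        V2 = (X2 - mu2).T                      # directions spanning hull 2
        A = np.hstack([V1, -V2])
        coeffs, *_ = np.linalg.lstsq(A, mu2 - mu1, rcond=None)
        a, b = coeffs[:V1.shape[1]], coeffs[V1.shape[1]:]
        return mu1 + V1 @ a, mu2 + V2 @ b

    def fit_bisecting_hyperplane(X1, X2):
        """Hyperplane orthogonal to the closest-point segment, bisecting it."""
        p1, p2 = closest_points(X1, X2)
        w = p1 - p2                            # normal vector along the segment
        b = -w @ (p1 + p2) / 2                 # plane passes through the midpoint
        return w, b

    # Toy 3-D example: two non-intersecting affine hulls (lines).
    X_pos = np.array([[2.0, 0.0, 0.0], [2.0, 1.0, 0.0]])
    X_neg = np.array([[-2.0, 0.0, 0.0], [-2.0, 0.0, 1.0]])
    w, b = fit_bisecting_hyperplane(X_pos, X_neg)
    print(np.sign(X_pos @ w + b))  # → [1. 1.]
    print(np.sign(X_neg @ w + b))  # → [-1. -1.]
    ```

    Note that this only works when the two affine hulls do not intersect; in high-dimensional settings with few samples per class (the regime the paper targets), that is typically the case.
    
    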